Neural Network Project¶
Team members: Maelwenn Labidurie & Albane Coiffe
Group DAI
24/07/25
I - Introduction¶
For this project, we chose to implement a deep convolutional neural network (Deep CNN) capable of detecting whether parking spots are occupied or free from camera images. Our main objective is to build a model that can be tested in real-world conditions, specifically on the parking lot located just outside Maelwenn's residence.
By using real-world footage, we aim to validate the robustness and practicality of our model beyond controlled datasets. This also opens the door to future applications in smart parking systems, where such a model could:
Automatically monitor parking lots using video streams,
Enable real-time reservation of available spots,
Integrate into connected mobility or smart city platforms.
Below is an image showing the view of the parking lot as seen from Maelwenn’s window.

Contributions:
- Maelwenn handled the dataset creation and YOLO model training.
- Albane handled the TensorFlow model training and the model analysis.
II - Deep CNN¶
We chose an object detection approach in order to learn how to use YOLO to identify the status of parking spots.
Rather than classifying the entire image (e.g., as “full parking lot” or “empty parking lot”), our goal is for the model to detect each individual parking spot and determine whether it is occupied or free.
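To make that target output concrete, here is a minimal sketch (with made-up labels, not real model output) of how per-spot detections would let us summarize a lot's occupancy:

```python
from collections import Counter

# Hypothetical per-spot predictions: one class label per detected parking spot,
# following the two-class scheme ("free" / "occupied") we train YOLO on later.
detections = ["occupied", "free", "occupied", "free", "free"]

counts = Counter(detections)
print(f"{counts['free']} free / {counts['occupied']} occupied "
      f"out of {len(detections)} spots")
```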
III - Library imports¶
import os
import cv2
from matplotlib import pyplot as plt
import pandas as pd
from collections import Counter
import random
import numpy as np
import math
import time
IV - Test with TensorFlow¶
As seen in class, TensorFlow provides a convenient way to test pre-trained object detection models via TensorFlow Hub. Before fine-tuning YOLO, we wanted to try a baseline test using these off-the-shelf models.
We aim to evaluate how well a general-purpose object detection model (trained on OpenImages) can detect cars and distinguish occupied from empty parking spots, even though it was never trained specifically for that task.
The goal is to:
- Understand how object detection inference works in TensorFlow.
- Evaluate the limitations of pre-trained general models.
- Use this baseline to justify the need for a customized YOLO model later.
We use the ssd_mobilenet_v2 model from TF Hub and run inference on a real photo taken from the window of Maelwenn's residence (our target parking lot). We resize the image, prepare it for inference, and draw the bounding boxes using standard utilities from the course.
Let’s run the detector and visualize the results.
# For running inference on the TF-Hub module.
import tensorflow as tf
import tensorflow_hub as hub
# For downloading the image.
import matplotlib.pyplot as plt
import tempfile
from six.moves.urllib.request import urlopen
from six import BytesIO
# For drawing onto the image.
from PIL import Image
from PIL import ImageColor
from PIL import ImageDraw
from PIL import ImageFont
from PIL import ImageOps
# Print Tensorflow version
print(tf.__version__)
# Check available GPU devices.
print("The following GPU devices are available: %s" % tf.test.gpu_device_name())
2.18.0
The following GPU devices are available: /device:GPU:0
We choose the ssd_mobilenet_v2 model, load it, and select its default signature.
# ssd mobilenet version 2
module_handle = "https://tfhub.dev/google/openimages_v4/ssd/mobilenet_v2/1"
model_od = hub.load(module_handle)
# take a look at the available signatures for this particular model
model_od.signatures.keys()
KeysView(_SignatureMap({'default': <ConcreteFunction () -> Dict[['detection_class_labels', TensorSpec(shape=(None, 1), dtype=tf.int64, name=None)], ['detection_class_names', TensorSpec(shape=(None, 1), dtype=tf.string, name=None)], ['detection_class_entities', TensorSpec(shape=(None, 1), dtype=tf.string, name=None)], ['detection_boxes', TensorSpec(shape=(None, 4), dtype=tf.float32, name=None)], ['detection_scores', TensorSpec(shape=(None, 1), dtype=tf.float32, name=None)]] at 0x7D116AAFC5D0>}))
detector = model_od.signatures['default']
We reuse the same functions as in the lab to display and resize images (here from a local path rather than a URL).
def display_image(image):
    """
    Displays an image inside the notebook.
    This is used by resize_image_from_path()
    """
    fig = plt.figure(figsize=(20, 15))
    plt.grid(False)
    plt.imshow(image)
def resize_image_from_path(path, new_width=256, new_height=256, display=False):
    '''
    Loads an image from a local file, resizes it, and saves it to a temporary location.

    Args:
        path (string) -- local path to the image
        new_width (int) -- size in pixels used for resizing the width of the image
        new_height (int) -- size in pixels used for resizing the height of the image
        display (bool) -- if True, display the resized image in the notebook

    Returns:
        (string) -- path to the saved resized image
    '''
    # create a temporary file ending with ".jpg"
    _, filename = tempfile.mkstemp(suffix=".jpg")
    # open the image from local path
    pil_image = Image.open(path)
    # resize and crop to match the desired dimensions
    pil_image = ImageOps.fit(pil_image, (new_width, new_height), Image.LANCZOS)
    # convert to RGB color space
    pil_image_rgb = pil_image.convert("RGB")
    # save to the temporary file
    pil_image_rgb.save(filename, format="JPEG", quality=90)
    print("Image resized and saved to %s." % filename)
    if display:
        display_image(pil_image)
    return filename
Loading the test image
image_path = "./img/1.jpg"
downloaded_image_path = resize_image_from_path(image_path, new_width=640, new_height=320, display=True)
Image resized and saved to /tmp/tmpihgi7d84.jpg.
Still reusing the lab functions, this time to draw the bounding boxes found by the model and to run the model on the chosen image.
from PIL import Image, ImageColor, ImageDraw, ImageFont
def draw_bounding_box_on_image(image,
                               ymin,
                               xmin,
                               ymax,
                               xmax,
                               color,
                               font,
                               thickness=4,
                               display_str_list=()):
    """
    Adds a bounding box and, if needed, one or more labels
    (display_str_list) to the PIL image.

    Args:
        image : PIL.Image.Image (modified in place).
        ymin, xmin, ymax, xmax : normalized coordinates [0..1].
        color : name or RGB tuple for the box color.
        font : ImageFont.FreeTypeFont instance or default font.
        thickness : thickness of the box outline.
        display_str_list : list of strings to display (one per line).

    This function modifies `image` in place and returns nothing.
    """
    draw = ImageDraw.Draw(image)
    im_width, im_height = image.size

    # Compute the coordinates in pixels
    left = xmin * im_width
    right = xmax * im_width
    top = ymin * im_height
    bottom = ymax * im_height

    # Draw the box outline
    draw.line(
        [(left, top), (left, bottom), (right, bottom), (right, top), (left, top)],
        width=thickness,
        fill=color
    )

    # Compute the total height of the labels before drawing them.
    # For each display_str, measure (width, height) via getbbox()
    display_str_heights = []
    for ds in display_str_list:
        bbox = font.getbbox(ds)
        # bbox returns (x0, y0, x1, y1); height = y1 - y0
        height = bbox[3] - bbox[1]
        display_str_heights.append(height)
    total_display_str_height = (1 + 2 * 0.05) * sum(display_str_heights)

    # Starting vertical position for the text (above or below the box)
    if top > total_display_str_height:
        text_bottom = top
    else:
        text_bottom = top + total_display_str_height

    # Draw the text lines, from bottom to top
    for display_str in display_str_list[::-1]:
        bbox = font.getbbox(display_str)
        text_width = bbox[2] - bbox[0]
        text_height = bbox[3] - bbox[1]
        margin = np.ceil(0.05 * text_height)
        # Rectangular background behind the text
        draw.rectangle(
            [
                (left, text_bottom - text_height - 2 * margin),
                (left + text_width, text_bottom)
            ],
            fill=color
        )
        # Black text on colored background
        draw.text(
            (left + margin, text_bottom - text_height - margin),
            display_str,
            fill="black",
            font=font
        )
        text_bottom -= text_height + 2 * margin
def draw_boxes(image, boxes, class_names, scores, max_boxes=10, min_score=0.1):
    """
    Overlays the detected boxes with their label (class + percentage)
    on an image (numpy array). Returns the modified image (numpy array).

    Args:
        image : numpy array shape=(H,W,3), dtype uint8 or float32 [0..255/1].
        boxes : array shape=(N,4), normalized coordinates [ymin, xmin, ymax, xmax].
        class_names : list/array of bytes or str, length N.
        scores : array shape=(N,), confidence [0..1].
        max_boxes : maximum number of boxes to draw.
        min_score : minimum score required to draw a box.
    """
    colors = list(ImageColor.colormap.values())
    try:
        font = ImageFont.truetype(
            "/usr/share/fonts/truetype/liberation/LiberationSansNarrow-Regular.ttf",
            25
        )
    except IOError:
        print("Font not found, using the default font.")
        font = ImageFont.load_default()

    # Work on a PIL copy
    image_pil = Image.fromarray(np.uint8(image)).convert("RGB")
    for i in range(min(boxes.shape[0], max_boxes)):
        if scores[i] < min_score:
            continue
        ymin, xmin, ymax, xmax = boxes[i]
        class_name = class_names[i]
        if isinstance(class_name, bytes):
            class_name = class_name.decode("ascii")
        display_str = f"{class_name}: {int(100 * scores[i])}%"
        color = colors[hash(class_names[i]) % len(colors)]
        # Draw the box + the label
        draw_bounding_box_on_image(
            image_pil,
            ymin, xmin, ymax, xmax,
            color=color,
            font=font,
            display_str_list=[display_str]
        )
    # Return as a numpy array
    return np.array(image_pil)
def load_img(path):
    '''
    Loads a JPEG image and converts it to a tensor.

    Args:
        path (string) -- path to a locally saved JPEG image

    Returns:
        (tensor) -- an image tensor
    '''
    # read the file
    img = tf.io.read_file(path)
    # convert to a tensor
    img = tf.image.decode_jpeg(img, channels=3)
    return img
def run_detector(detector, path):
    '''
    Runs inference on a local file using an object detection model.

    Args:
        detector (model) -- an object detection model loaded from TF Hub
        path (string) -- path to an image saved locally
    '''
    # load an image tensor from a local file path
    img = load_img(path)
    # add a batch dimension in front of the tensor
    converted_img = tf.image.convert_image_dtype(img, tf.float32)[tf.newaxis, ...]
    # run inference using the model
    start_time = time.time()
    result = detector(converted_img)
    end_time = time.time()
    # save the results in a dictionary
    result = {key: value.numpy() for key, value in result.items()}
    # print results
    print("Found %d objects." % len(result["detection_scores"]))
    print("Inference time: ", end_time - start_time)
    print("result detection_boxes: ", result["detection_boxes"])
    print("detection_class_entities: ", result["detection_class_entities"])
    print("detection_scores: ", result["detection_scores"])
    # draw predicted boxes over the image
    image_with_boxes = draw_boxes(
        img.numpy(), result["detection_boxes"],
        result["detection_class_entities"], result["detection_scores"])
    # display the image
    display_image(image_with_boxes)
Using the previous function to run the model on our image
# runs the object detection model and prints information about the objects found
run_detector(detector, downloaded_image_path)
Found 100 objects. Inference time: 0.2517 s
result detection_boxes: [[8.698e-03 7.249e-01 2.631e-01 9.249e-01] [7.295e-03 6.360e-01 2.499e-01 7.832e-01] ...] (100 boxes, output truncated)
detection_class_entities: [b'Tree' b'Tree' b'Tree' ... b'Car' ... b'Wheel' ...] (overwhelmingly Tree, Plant, House and Window; a single Car and a few Wheel detections)
detection_scores: [0.2001 0.1703 0.1664 ... 0.0748] (all confidences below 0.21)
Conclusion: Why we moved away from TF Hub¶
As expected, the pre-trained model was able to detect generic objects, but with poor accuracy and many false positives.
In our case:
- It completely failed to detect parking spots.
- It even mistook a car for a boat (see image).
- It predicted multiple irrelevant bounding boxes, including dozens for a single tree.
These models were trained on large, general datasets (like OpenImages) and are not adapted to our specific problem: detecting empty vs. occupied parking spots.
Conclusion: This test highlights the limits of generic object detection models for domain-specific applications. That’s why we decided to move forward with YOLO, fine-tuned on a custom dataset of our real parking lot, with specific labels for empty and occupied spaces.
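Before switching models, one simple mitigation is to post-filter the TF Hub output, keeping only the classes of interest above a score threshold. The sketch below is our own hypothetical helper (not part of the lab code), assuming arrays shaped like the `result` dictionary produced by `run_detector`:

```python
import numpy as np

def filter_detections(boxes, class_entities, scores,
                      keep_classes=(b"Car",), min_score=0.2):
    """Keep only detections whose class is in keep_classes and whose
    score reaches min_score; returns filtered copies of the inputs."""
    keep = [i for i, (c, s) in enumerate(zip(class_entities, scores))
            if c in keep_classes and s >= min_score]
    return boxes[keep], [class_entities[i] for i in keep], scores[keep]

# Toy example with two detections: only the confident Car survives.
boxes = np.array([[0.1, 0.1, 0.2, 0.2], [0.3, 0.3, 0.4, 0.4]])
classes = [b"Tree", b"Car"]
scores = np.array([0.15, 0.60])
fb, fc, fs = filter_detections(boxes, classes, scores)
print(fc)  # [b'Car']
```

This keeps the relevant Car boxes but still cannot tell an occupied spot from an empty one, which is exactly the gap the fine-tuned YOLO model is meant to close.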
Installation of the required libraries
!pip install roboflow ultralytics
Collecting roboflow ... Collecting ultralytics ... (pip output truncated) ... Successfully installed filetype-1.2.0 idna-3.7 nvidia-cublas-cu12-12.4.5.8 nvidia-cuda-cupti-cu12-12.4.127 nvidia-cuda-nvrtc-cu12-12.4.127 nvidia-cuda-runtime-cu12-12.4.127 nvidia-cudnn-cu12-9.1.0.70 nvidia-cufft-cu12-11.2.1.3 nvidia-curand-cu12-10.3.5.147 nvidia-cusolver-cu12-11.6.1.9 nvidia-cusparse-cu12-12.3.1.170 nvidia-nvjitlink-cu12-12.4.127 opencv-python-headless-4.10.0.84 pillow-avif-plugin-1.5.2 pillow-heif-1.0.0 python-dotenv-1.1.1 roboflow-1.2.1 ultralytics-8.3.168 ultralytics-thop-2.0.14
At first, we intended to use the public PKLot dataset: link.
This dataset contains images captured from parking lot surveillance cameras, with multiple parking spaces visible in each image.
Each image is associated with a .txt file located in the labels folder, containing annotations in the YOLOv5 format.
In the .txt files located under labels, each line corresponds to one detected object, with 5 values:
<class_id> <x_center> <y_center> <width> <height>
| Element | Description |
|---|---|
| class_id | Class index (e.g., 0 = occupied, 1 = free) |
| x_center | Center of the bounding box along the X-axis (normalized) |
| y_center | Center of the bounding box along the Y-axis (normalized) |
| width | Width of the bounding box (normalized) |
| height | Height of the bounding box (normalized) |
The name of the .txt file matches the name of its corresponding image file.
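To make this format concrete, here is a minimal sketch (the helper name and the example values are ours, not part of the dataset) that converts one annotation line into pixel coordinates, the same arithmetic YOLO uses internally:

```python
def yolo_line_to_pixels(line, img_w, img_h):
    """Convert one '<class_id> <x_center> <y_center> <width> <height>' line
    (normalized values) into (class_id, x1, y1, x2, y2) pixel coordinates."""
    cls, x, y, w, h = line.strip().split()
    x, y, w, h = map(float, (x, y, w, h))
    # The box is centred on (x, y), so the corners sit half a width/height away
    x1 = int((x - w / 2) * img_w)
    y1 = int((y - h / 2) * img_h)
    x2 = int((x + w / 2) * img_w)
    y2 = int((y + h / 2) * img_h)
    return int(cls), x1, y1, x2, y2

# A class-1 box centred in a 640x640 image, half its width and height
print(yolo_line_to_pixels("1 0.5 0.5 0.5 0.5", 640, 640))  # (1, 160, 160, 480, 480)
```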
The goal is to train YOLOv5 on this dataset so that, given a new image, the model can:
- Locate all visible parking spaces, and
- Predict whether each one is free or occupied.
Data processing¶
DATASET_DIR = "PKLot.v2-640.yolov5pytorch"
SPLITS = ["train", "valid", "test"]
def check_correspondance():
    for split in SPLITS:
        img_dir = os.path.join(DATASET_DIR, split, "images")
        lbl_dir = os.path.join(DATASET_DIR, split, "labels")
        if not os.path.exists(img_dir) or not os.path.exists(lbl_dir):
            print(f"Missing folder in {split}")
            continue
        # Compare image and label basenames to find orphans on either side
        img_files = {os.path.splitext(f)[0] for f in os.listdir(img_dir) if f.endswith(".jpg")}
        lbl_files = {os.path.splitext(f)[0] for f in os.listdir(lbl_dir) if f.endswith(".txt")}
        only_images = img_files - lbl_files
        only_labels = lbl_files - img_files
        print(f"\n🔍 Split: {split}")
        print(f" - Total images: {len(img_files)}")
        print(f" - Total labels: {len(lbl_files)}")
        if only_images:
            print(f"Images without labels: {sorted(only_images)}")
        if only_labels:
            print(f"Labels without images: {sorted(only_labels)}")
        if not only_images and not only_labels:
            print("Everything is consistent.")
check_correspondance()
🔍 Split: train - Total images: 8502 - Total labels: 8502 Everything is consistent. 🔍 Split: valid - Total images: 2424 - Total labels: 2424 Everything is consistent. 🔍 Split: test - Total images: 1216 - Total labels: 1216 Everything is consistent.
Every image is correctly associated with its label file.
def check_image_sizes(dataset_dir, split="train"):
    img_dir = os.path.join(dataset_dir, split, "images")
    size_counter = Counter()
    for fname in os.listdir(img_dir):
        if not fname.endswith(".jpg"):
            continue
        path = os.path.join(img_dir, fname)
        img = cv2.imread(path)
        if img is None:
            print(f"Unable to read image: {fname}")
            continue
        # Count occurrences of each (height, width) pair
        h, w = img.shape[:2]
        size_counter[(h, w)] += 1
    print(f"\nImage dimensions in '{split}/images/':")
    for size, count in size_counter.items():
        print(f" - {size[1]}x{size[0]}: {count} image(s)")
    if len(size_counter) == 1:
        print("All images have the same size.")
    else:
        print("Several image sizes detected.")
check_image_sizes("PKLot.v2-640.yolov5pytorch", "train")
check_image_sizes("PKLot.v2-640.yolov5pytorch", "valid")
check_image_sizes("PKLot.v2-640.yolov5pytorch", "test")
Image dimensions in 'train/images/': - 640x640: 8502 image(s) All images have the same size. Image dimensions in 'valid/images/': - 640x640: 2424 image(s) All images have the same size. Image dimensions in 'test/images/': - 640x640: 1216 image(s) All images have the same size.
All the images have the same size (640x640).
def check_normalized_labels(dataset_path):
    splits = ['train', 'valid', 'test']
    errors = []
    for split in splits:
        label_dir = os.path.join(dataset_path, split, 'labels')
        for fname in os.listdir(label_dir):
            if not fname.endswith('.txt'):
                continue
            fpath = os.path.join(label_dir, fname)
            with open(fpath, 'r') as f:
                lines = f.readlines()
            for i, line in enumerate(lines):
                parts = line.strip().split()
                if len(parts) != 5:
                    errors.append((split, fname, i, "Wrong format"))
                    continue
                cls, x, y, w, h = parts
                try:
                    x, y, w, h = map(float, [x, y, w, h])
                    cls = int(cls)
                    # Every coordinate must lie in [0, 1] in YOLO format
                    if not (0 <= x <= 1 and 0 <= y <= 1 and 0 <= w <= 1 and 0 <= h <= 1):
                        errors.append((split, fname, i, f"Values outside [0,1]: {x}, {y}, {w}, {h}"))
                except ValueError:
                    errors.append((split, fname, i, "Conversion failed"))
    if not errors:
        print("All annotations are correctly normalized.")
    else:
        print("Problems detected:")
        for err in errors:
            print(f"[{err[0]}] {err[1]} (line {err[2]+1}): {err[3]}")
check_normalized_labels("PKLot.v2-640.yolov5pytorch")
All annotations are correctly normalized.
All the label values are correctly normalized.
| Feature | Normalized value |
|---|---|
| x_center | between 0 and 1 |
| y_center | between 0 and 1 |
| width | between 0 and 1 |
| height | between 0 and 1 |
| class_id | integer >= 0 |
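Beyond checking that values are normalized, it is also worth knowing how many boxes of each class the dataset contains, since a strong class imbalance would bias training. A quick sketch using the same directory layout as above (the helper name is ours):

```python
import os
from collections import Counter

def count_classes(dataset_dir, splits=("train", "valid", "test")):
    """Count how many bounding boxes of each class_id appear in the label files."""
    counts = Counter()
    for split in splits:
        lbl_dir = os.path.join(dataset_dir, split, "labels")
        if not os.path.isdir(lbl_dir):
            continue
        for fname in os.listdir(lbl_dir):
            if not fname.endswith(".txt"):
                continue
            with open(os.path.join(lbl_dir, fname)) as f:
                for line in f:
                    parts = line.split()
                    if parts:
                        # The first token of each line is the class index
                        counts[int(parts[0])] += 1
    return counts
```

Running this on the dataset directory would print one count per class, making any imbalance between occupied and free spots immediately visible.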
def show_random_image_with_boxes(split="train"):
    img_dir = os.path.join(DATASET_DIR, split, "images")
    lbl_dir = os.path.join(DATASET_DIR, split, "labels")
    images = sorted([f for f in os.listdir(img_dir) if f.endswith(".jpg")])
    if not images:
        print("No image found.")
        return
    # Pick a random image
    img_name = random.choice(images)
    label_name = img_name.replace(".jpg", ".txt")
    img_path = os.path.join(img_dir, img_name)
    label_path = os.path.join(lbl_dir, label_name)
    img = cv2.imread(img_path)
    h, w = img.shape[:2]
    if not os.path.exists(label_path):
        print(f"⚠️ No label for image: {img_name}")
        return
    with open(label_path, "r") as f:
        for line in f:
            cls, x, y, bw, bh = map(float, line.strip().split())
            # Convert normalized YOLO coordinates into pixel corners
            x1 = int((x - bw / 2) * w)
            y1 = int((y - bh / 2) * h)
            x2 = int((x + bw / 2) * w)
            y2 = int((y + bh / 2) * h)
            cv2.rectangle(img, (x1, y1), (x2, y2), (0, 255, 0), 2)
            cv2.putText(img, f"Class {int(cls)}", (x1, y1 - 5),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    plt.figure(figsize=(8, 6))
    plt.title(f"{img_name} with bounding boxes")
    plt.imshow(cv2.cvtColor(img, cv2.COLOR_BGR2RGB))
    plt.axis("off")
    plt.show()
show_random_image_with_boxes("train")
import os
import shutil
def backup_clean_dataset(source_dir, dest_dir="../PKLot_cleaned"):
if os.path.exists(dest_dir):
print(f"The folder '{dest_dir}' already exists. Please choose another name or delete it.")
return
shutil.copytree(source_dir, dest_dir)
print(f"Cleaned dataset copied to: {dest_dir}")
# Usage
backup_clean_dataset("PKLot.v2-640.yolov5pytorch")
Cleaned dataset copied to: ../PKLot_cleaned
First YOLO Test¶
Dataset Configuration¶
To train our YOLOv5 model, we prepared a custom dataset configuration file called data.yaml. This file defines the path to our cleaned dataset and the structure expected by the training pipeline. Here's the content of our data.yaml file:
path: ./PKLot_cleaned
train: train/images
val: valid/images
nc: 2
names:
- empty
- occupied
This means:
- The dataset is located in the PKLot_cleaned directory, one level above the project.
- Training images and labels are located in PKLot_cleaned/train/images and PKLot_cleaned/train/labels.
- Validation data is in PKLot_cleaned/valid/images and PKLot_cleaned/valid/labels.
- There are two classes: empty (free parking spot) and occupied (taken parking spot).
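A mistake in data.yaml (for instance an nc that disagrees with the names list) makes YOLO training fail in confusing ways, so it is worth loading the file back as a sanity check. A minimal sketch using PyYAML (which we also use later in the merge script; the helper name is ours):

```python
import yaml

def check_data_yaml(path):
    """Load a YOLO data.yaml and verify that nc matches the number of class names."""
    with open(path) as f:
        cfg = yaml.safe_load(f)
    # nc is the declared class count; names is the actual list of class labels
    assert cfg["nc"] == len(cfg["names"]), "nc does not match the number of names"
    return cfg
```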
Training Procedure¶
We used the following command to launch the training:
!yolo detect train data=/content/data.yaml model=yolov5nu.pt epochs=100 imgsz=640
We did not run this cell in this notebook because its output was very long and not very useful (for an equivalent output, see the training of the third YOLO model below).
We trained the model for 100 epochs with an input image size of 640x640 pixels.
This setup allows the model to learn how to localize and classify individual parking spots as either free or occupied.
Results of the first YOLO model¶
Unfortunately, the results of our first training attempt were unsuccessful. After training the YOLOv5n model for 100 epochs, the model was unable to correctly detect or classify any parking spots in the validation or test images.
Despite following standard YOLOv5 training procedures and using a structured dataset, the model consistently returned no bounding boxes during inference, even when run on clearly annotated images.
We believe one of the main reasons for our model’s poor performance is the visual discrepancy between our training data and the target environment.

The first image represents the real parking lot we intended to use for testing — a small residential area with a low camera angle, partial occlusions, and varied lighting conditions.
The second image, on the other hand, is an example from the public dataset we used for training, which contains wide-angle views from elevated cameras, uniform lighting, and clearly separated parking lines.
This difference in camera angle, parking layout, background, and overall scene composition likely made it difficult for the model to generalize and correctly detect parking spots in our real-world images.
Solution¶
To address this issue, we decided to look for another dataset that more closely resembles our real parking lot. Our goal was to find images with similar camera angles, parking layouts, and environmental conditions to improve the model’s ability to generalize.
We even considered using multiple datasets and merging them into a single training set to increase diversity and robustness. By combining data from different sources, we aim to train a more resilient and accurate model, capable of performing reliably in our real-world testing environment.
Dataset 2¶
After analyzing our initial results, we decided to restart from scratch with a new approach to dataset selection. Our goal was to find training data that better matches the visual and structural characteristics of the real parking lot we plan to use for testing (see image above).
To do so, we explored the Roboflow public repository and selected three datasets that contain images with similar perspectives, parking layouts, and conditions. These datasets were:
- Parking Space Detection 1
- Car Space Find
- Parking Lot View 3
Downloading the first new dataset¶
We access the project named parking-space-pubnz-ftfle in the workspace data-a09tr.
We select version 1 of the dataset.
Then, we download it in YOLOv8 format, which creates a new folder locally (e.g., parking-space-pubnz-ftfle-1/).
from roboflow import Roboflow
rf = Roboflow(api_key="8J41NUsg0Vpt63zevgPD")
project = rf.workspace("data-a09tr").project("parking-space-pubnz-ftfle")
version = project.version(1)
dataset = version.download("yolov8")
loading Roboflow workspace... loading Roboflow project...
Downloading Dataset Version Zip in parking-space-1 to yolov8:: 100%|██████████| 456297/456297 [00:17<00:00, 25446.51it/s]
Extracting Dataset Version Zip to parking-space-1 in yolov8:: 100%|██████████| 3184/3184 [00:00<00:00, 4840.50it/s]
Downloading the second dataset¶
Same process, but now for another dataset: car-space-find-wozyb.
project = rf.workspace("data-a09tr").project("car-space-find-wozyb")
version = project.version(1)
dataset = version.download("yolov8")
loading Roboflow workspace... loading Roboflow project...
Downloading Dataset Version Zip in Car-Space-Find-1 to yolov8:: 100%|██████████| 128349/128349 [00:04<00:00, 25836.69it/s]
Extracting Dataset Version Zip to Car-Space-Find-1 in yolov8:: 100%|██████████| 1212/1212 [00:00<00:00, 6614.33it/s]
Downloading the third dataset¶
Again, same steps for a third dataset.
project = rf.workspace("data-a09tr").project("parking-space-ipm1b-dt0x2")
version = project.version(1)
dataset = version.download("yolov8")
loading Roboflow workspace... loading Roboflow project...
Downloading Dataset Version Zip in Parking-Space-1 to yolov8:: 100%|██████████| 843640/843640 [00:27<00:00, 31076.70it/s]
Extracting Dataset Version Zip to Parking-Space-1 in yolov8:: 100%|██████████| 6258/6258 [00:00<00:00, 6848.17it/s]
Function to merge datasets¶
We wrote a Python script to merge these three YOLO-format datasets into a single, unified dataset directory. The function does the following:
- Combine the image and label files from all three datasets into a single unified folder (parking_multi/)
- Preserve the correct YOLO folder structure (train/, valid/, test/)
- Add unique suffixes (e.g., _ds1, _ds2, _ds3) to all filenames to avoid naming conflicts
- Automatically generate a data.yaml file inside the merged folder, specifying the dataset configuration
The data.yaml defines:
- The base path of the dataset
- The relative paths to training and validation images
- The number of object classes (empty, occupied)
import shutil
from pathlib import Path
import yaml
def merge_yolo_datasets(source1, source2, source3, destination):
# Create folders
for split in ['train', 'valid', 'test']:
for sub in ['images', 'labels']:
Path(f"{destination}/{split}/{sub}").mkdir(parents=True, exist_ok=True)
def copy_with_suffix(src_path, dst_path, suffix):
if src_path.exists():
for file in src_path.iterdir():
new_name = file.stem + suffix + file.suffix
shutil.copy(file, dst_path / new_name)
# Copy files from all 3 sources
for split in ['train', 'valid', 'test']:
for sub in ['images', 'labels']:
dst = Path(f"{destination}/{split}/{sub}")
copy_with_suffix(Path(f"{source1}/{split}/{sub}"), dst, "_ds1")
copy_with_suffix(Path(f"{source2}/{split}/{sub}"), dst, "_ds2")
copy_with_suffix(Path(f"{source3}/{split}/{sub}"), dst, "_ds3")
# Create data.yaml
data_yaml = {
'path': destination,
'train': 'train/images',
'val': 'valid/images',
'nc': 2,
'names': ['empty', 'occupied']
}
with open(Path(destination) / 'data.yaml', 'w') as file:
yaml.dump(data_yaml, file, default_flow_style=False)
The second YOLO model¶
After merging the three selected Roboflow datasets into a new training set, we retrained our YOLOv5 model using the improved data. Our goal was to overcome the total failure of the first model, which was unable to detect any parking spots at all.
from ultralytics import YOLO
model = YOLO("yolov8n.pt")
model.train(data="parking_multi/data.yaml", epochs=50, imgsz=640)

This second version of the model produced noticeably better results. As shown in the image below, the model is now capable of detecting some parking spots and classifying them as either "occupied" or "empty". This is already a significant improvement compared to the first test, where no predictions were returned at all.
However, the overall performance remains weak:
- Several cars are not detected at all,
- Some bounding boxes were misaligned or misclassified,
- The confidence scores remain low (around 0.56–0.58).
These results suggest that while the model has started to learn some useful features, it is still far from production-ready.
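To quantify this weakness rather than eyeballing individual images, one can tally detections and confidences per class. A small sketch of that idea (the helper name is ours; its input is a list of (class_name, confidence) pairs, e.g. extracted from the model's prediction results, and the 0.5 threshold is our assumption):

```python
from collections import defaultdict

def summarize_detections(detections, conf_threshold=0.5):
    """Group detections by class and report (count, mean confidence),
    keeping only boxes at or above the confidence threshold."""
    grouped = defaultdict(list)
    for name, conf in detections:
        if conf >= conf_threshold:
            grouped[name].append(conf)
    return {name: (len(confs), sum(confs) / len(confs))
            for name, confs in grouped.items()}

# Example with confidences similar to the ones we observed (0.56-0.58)
dets = [("occupied", 0.58), ("occupied", 0.56), ("empty", 0.57), ("empty", 0.31)]
print(summarize_detections(dets))
```

Low per-class counts and mean confidences barely above the threshold are exactly the symptoms the second model showed.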
Dataset 3¶
To improve on this second model, we chose to create our own dataset with real pictures of the parking lot we wanted to use.
First, we took about 100 pictures under different lighting conditions, angles, and times of day.
Then we had to annotate every picture by adding bounding boxes. For that step we used humansignal.com, which let us easily mark every picture with bounding boxes. We created two labels (empty and occupied) to match the datasets we got from Roboflow.

We then annotated every picture we took by hand.

Purple stands for empty and red for occupied. Finally, we split the dataset between training and validation, using a function from a Git repository. We chose to put 90% of the pictures in the training set and 10% in the validation set.
!wget -O /content/train_val_split.py https://raw.githubusercontent.com/EdjeElectronics/Train-and-Deploy-YOLO-Models/refs/heads/main/utils/train_val_split.py
!python train_val_split.py --datapath="/content/custom_data" --train_pct=0.9
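The script above shuffles the images and distributes 90% of them into a training folder and 10% into a validation folder, moving each label file along with its image. A simplified sketch of that idea (the folder names and function signature are our assumptions; the actual script from the repository may differ):

```python
import os
import random
import shutil

def split_dataset(datapath, train_pct=0.9, seed=0):
    """Randomly assign each image (and its matching label) to train or validation."""
    img_dir = os.path.join(datapath, "images")
    lbl_dir = os.path.join(datapath, "labels")
    images = sorted(f for f in os.listdir(img_dir) if f.endswith(".jpg"))
    random.Random(seed).shuffle(images)  # fixed seed for reproducibility
    n_train = int(len(images) * train_pct)
    for split, files in (("train", images[:n_train]), ("validation", images[n_train:])):
        for sub in ("images", "labels"):
            os.makedirs(os.path.join(datapath, split, sub), exist_ok=True)
        for img in files:
            lbl = os.path.splitext(img)[0] + ".txt"
            # Move the image and, if present, its label file together
            shutil.move(os.path.join(img_dir, img),
                        os.path.join(datapath, split, "images", img))
            lbl_path = os.path.join(lbl_dir, lbl)
            if os.path.exists(lbl_path):
                shutil.move(lbl_path, os.path.join(datapath, split, "labels", lbl))
```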
After doing all of that, we merged this personalized dataset with the three we found on Roboflow.
import shutil
from pathlib import Path
import yaml
def merge_yolo_datasets(source1, source2, source3, source4, destination):
# Create the final folder structure
for split in ['train', 'valid', 'test']:
for sub in ['images', 'labels']:
Path(f"{destination}/{split}/{sub}").mkdir(parents=True, exist_ok=True)
def copy_with_suffix(src_path, dst_path, suffix):
if src_path.exists():
for file in src_path.iterdir():
new_name = file.stem + suffix + file.suffix
shutil.copy(file, dst_path / new_name)
# Copy the files from the 4 sources
for split in ['train', 'valid', 'test']:
for sub in ['images', 'labels']:
dst = Path(f"{destination}/{split}/{sub}")
copy_with_suffix(Path(f"{source1}/{split}/{sub}"), dst, "_ds1")
copy_with_suffix(Path(f"{source2}/{split}/{sub}"), dst, "_ds2")
copy_with_suffix(Path(f"{source3}/{split}/{sub}"), dst, "_ds3")
copy_with_suffix(Path(f"{source4}/{split}/{sub}"), dst, "_ds4")
# Generate the data.yaml file
data_yaml = {
'path': destination,
'train': 'train/images',
'val': 'valid/images',
'nc': 2,
'names': ['empty', 'occupied']
}
with open(Path(destination) / 'data.yaml', 'w') as file:
yaml.dump(data_yaml, file, default_flow_style=False)
# Usage
merge_yolo_datasets(
source1="Car-Space-Find-1",
source2="parking-space-1",
source3="Parking-Space-1",
source4="data_pk_quentin",
destination="parking_multi"
)
This merges the four datasets into one combined dataset at /content/parking_multi.
We can now train the YOLOv8 model using the new data.yaml inside parking_multi.
The third YOLO model¶
from ultralytics import YOLO
# Load a base model (e.g., nano version)
model = YOLO("yolov8n.pt")
Downloading https://github.com/ultralytics/assets/releases/download/v8.3.0/yolov8n.pt to 'yolov8n.pt'...
100%|██████████| 6.25M/6.25M [00:00<00:00, 459MB/s]
# Train on your dataset
model.train(data="parking_multi/data.yaml", epochs=50, imgsz=640)
Ultralytics 8.3.159 🚀 Python-3.11.13 torch-2.6.0+cu124 CUDA:0 (NVIDIA L4, 22693MiB)
engine/trainer: agnostic_nms=False, amp=True, augment=False, auto_augment=randaugment, batch=16, bgr=0.0, box=7.5, cache=False, cfg=None, classes=None, close_mosaic=10, cls=0.5, conf=None, copy_paste=0.0, copy_paste_mode=flip, cos_lr=False, cutmix=0.0, data=parking_multi/data.yaml, degrees=0.0, deterministic=True, device=None, dfl=1.5, dnn=False, dropout=0.0, dynamic=False, embed=None, epochs=50, erasing=0.4, exist_ok=False, fliplr=0.5, flipud=0.0, format=torchscript, fraction=1.0, freeze=None, half=False, hsv_h=0.015, hsv_s=0.7, hsv_v=0.4, imgsz=640, int8=False, iou=0.7, keras=False, kobj=1.0, line_width=None, lr0=0.01, lrf=0.01, mask_ratio=4, max_det=300, mixup=0.0, mode=train, model=yolov8n.pt, momentum=0.937, mosaic=1.0, multi_scale=False, name=train, nbs=64, nms=False, opset=None, optimize=False, optimizer=auto, overlap_mask=True, patience=100, perspective=0.0, plots=True, pose=12.0, pretrained=True, profile=False, project=None, rect=False, resume=False, retina_masks=False, save=True, save_conf=False, save_crop=False, save_dir=runs/detect/train, save_frames=False, save_json=False, save_period=-1, save_txt=False, scale=0.5, seed=0, shear=0.0, show=False, show_boxes=True, show_conf=True, show_labels=True, simplify=True, single_cls=False, source=None, split=val, stream_buffer=False, task=detect, time=None, tracker=botsort.yaml, translate=0.1, val=True, verbose=True, vid_stride=1, visualize=False, warmup_bias_lr=0.1, warmup_epochs=3.0, warmup_momentum=0.8, weight_decay=0.0005, workers=8, workspace=None
Downloading https://ultralytics.com/assets/Arial.ttf to '/root/.config/Ultralytics/Arial.ttf'...
100%|██████████| 755k/755k [00:00<00:00, 93.4MB/s]
Overriding model.yaml nc=80 with nc=2
from n params module arguments
0 -1 1 464 ultralytics.nn.modules.conv.Conv [3, 16, 3, 2]
1 -1 1 4672 ultralytics.nn.modules.conv.Conv [16, 32, 3, 2]
2 -1 1 7360 ultralytics.nn.modules.block.C2f [32, 32, 1, True]
3 -1 1 18560 ultralytics.nn.modules.conv.Conv [32, 64, 3, 2]
4 -1 2 49664 ultralytics.nn.modules.block.C2f [64, 64, 2, True]
5 -1 1 73984 ultralytics.nn.modules.conv.Conv [64, 128, 3, 2]
6 -1 2 197632 ultralytics.nn.modules.block.C2f [128, 128, 2, True]
7 -1 1 295424 ultralytics.nn.modules.conv.Conv [128, 256, 3, 2]
8 -1 1 460288 ultralytics.nn.modules.block.C2f [256, 256, 1, True]
9 -1 1 164608 ultralytics.nn.modules.block.SPPF [256, 256, 5]
10 -1 1 0 torch.nn.modules.upsampling.Upsample [None, 2, 'nearest']
11 [-1, 6] 1 0 ultralytics.nn.modules.conv.Concat [1]
12 -1 1 148224 ultralytics.nn.modules.block.C2f [384, 128, 1]
13 -1 1 0 torch.nn.modules.upsampling.Upsample [None, 2, 'nearest']
14 [-1, 4] 1 0 ultralytics.nn.modules.conv.Concat [1]
15 -1 1 37248 ultralytics.nn.modules.block.C2f [192, 64, 1]
16 -1 1 36992 ultralytics.nn.modules.conv.Conv [64, 64, 3, 2]
17 [-1, 12] 1 0 ultralytics.nn.modules.conv.Concat [1]
18 -1 1 123648 ultralytics.nn.modules.block.C2f [192, 128, 1]
19 -1 1 147712 ultralytics.nn.modules.conv.Conv [128, 128, 3, 2]
20 [-1, 9] 1 0 ultralytics.nn.modules.conv.Concat [1]
21 -1 1 493056 ultralytics.nn.modules.block.C2f [384, 256, 1]
22 [15, 18, 21] 1 751702 ultralytics.nn.modules.head.Detect [2, [64, 128, 256]]
Model summary: 129 layers, 3,011,238 parameters, 3,011,222 gradients, 8.2 GFLOPs
Transferred 319/355 items from pretrained weights
Freezing layer 'model.22.dfl.conv.weight'
AMP: running Automatic Mixed Precision (AMP) checks...
Downloading https://github.com/ultralytics/assets/releases/download/v8.3.0/yolo11n.pt to 'yolo11n.pt'...
100%|██████████| 5.35M/5.35M [00:00<00:00, 379MB/s]
AMP: checks passed ✅ train: Fast image access ✅ (ping: 0.0±0.0 ms, read: 2889.6±769.7 MB/s, size: 244.5 KB)
train: Scanning /content/parking_multi/train/labels... 3995 images, 147 backgrounds, 0 corrupt: 100%|██████████| 3995/3995 [00:03<00:00, 1175.72it/s]
train: /content/parking_multi/train/images/4878f6ff__6f1ecdf8-20250623_192235_ds4.jpg: corrupt JPEG restored and saved train: /content/parking_multi/train/images/9d1704a7__5325d1f4-20250623_191251_ds4.jpg: corrupt JPEG restored and saved train: /content/parking_multi/train/images/b2dc4b08__9c412855-20250623_193330_ds4.jpg: corrupt JPEG restored and saved train: New cache created: /content/parking_multi/train/labels.cache WARNING ⚠️ Box and segment counts should be equal, but got len(segments) = 37175, len(boxes) = 65124. To resolve this only boxes will be used and all segments will be removed. To avoid this please supply either a detect or segment dataset, not a detect-segment mixed dataset.
albumentations: Blur(p=0.01, blur_limit=(3, 7)), MedianBlur(p=0.01, blur_limit=(3, 7)), ToGray(p=0.01, method='weighted_average', num_output_channels=3), CLAHE(p=0.01, clip_limit=(1.0, 4.0), tile_grid_size=(8, 8)) val: Fast image access ✅ (ping: 0.0±0.0 ms, read: 891.5±652.8 MB/s, size: 99.5 KB)
val: Scanning /content/parking_multi/valid/labels... 1004 images, 30 backgrounds, 0 corrupt: 100%|██████████| 1004/1004 [00:01<00:00, 815.92it/s]
val: New cache created: /content/parking_multi/valid/labels.cache WARNING ⚠️ Box and segment counts should be equal, but got len(segments) = 10394, len(boxes) = 16978. To resolve this only boxes will be used and all segments will be removed. To avoid this please supply either a detect or segment dataset, not a detect-segment mixed dataset. Plotting labels to runs/detect/train/labels.jpg... optimizer: 'optimizer=auto' found, ignoring 'lr0=0.01' and 'momentum=0.937' and determining best 'optimizer', 'lr0' and 'momentum' automatically... optimizer: AdamW(lr=0.001667, momentum=0.9) with parameter groups 57 weight(decay=0.0), 64 weight(decay=0.0005), 63 bias(decay=0.0) Image sizes 640 train, 640 val Using 8 dataloader workers Logging results to runs/detect/train Starting training for 50 epochs... Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
1/50 3.45G 1.307 1.351 1.227 339 640: 100%|██████████| 250/250 [00:31<00:00, 8.05it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:06<00:00, 4.84it/s]
all 1004 16978 0.879 0.804 0.889 0.659
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
2/50 3.47G 0.9495 0.7644 1.061 309 640: 100%|██████████| 250/250 [00:28<00:00, 8.83it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:05<00:00, 6.27it/s]
all 1004 16978 0.917 0.839 0.924 0.702
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
3/50 3.49G 0.8888 0.6841 1.03 487 640: 100%|██████████| 250/250 [00:28<00:00, 8.78it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:05<00:00, 6.31it/s]
all 1004 16978 0.917 0.879 0.939 0.733
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
4/50 3.51G 0.8318 0.6261 1.019 320 640: 100%|██████████| 250/250 [00:27<00:00, 9.09it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.67it/s]
all 1004 16978 0.898 0.88 0.935 0.738
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
5/50 3.95G 0.8038 0.5795 1.005 351 640: 100%|██████████| 250/250 [00:27<00:00, 9.14it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.66it/s]
all 1004 16978 0.921 0.903 0.954 0.769
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
6/50 4.39G 0.7636 0.5504 0.9943 428 640: 100%|██████████| 250/250 [00:27<00:00, 9.16it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.83it/s]
all 1004 16978 0.926 0.903 0.952 0.769
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
7/50 4.41G 0.7508 0.5303 0.9883 201 640: 100%|██████████| 250/250 [00:27<00:00, 9.05it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.89it/s]
all 1004 16978 0.929 0.925 0.961 0.796
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
8/50 4.42G 0.7574 0.525 0.99 184 640: 100%|██████████| 250/250 [00:27<00:00, 9.12it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.71it/s]
all 1004 16978 0.912 0.901 0.954 0.788
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
9/50 4.9G 0.7333 0.5104 0.9865 440 640: 100%|██████████| 250/250 [00:27<00:00, 9.15it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.91it/s]
all 1004 16978 0.93 0.912 0.961 0.8
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
10/50 4.92G 0.7009 0.4817 0.9701 330 640: 100%|██████████| 250/250 [00:27<00:00, 9.19it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.73it/s]
all 1004 16978 0.921 0.932 0.965 0.814
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
11/50 4.94G 0.6801 0.472 0.9646 307 640: 100%|██████████| 250/250 [00:27<00:00, 9.12it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.68it/s]
all 1004 16978 0.93 0.918 0.959 0.802
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
12/50 4.95G 0.6689 0.4584 0.9622 542 640: 100%|██████████| 250/250 [00:27<00:00, 9.05it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.89it/s]
all 1004 16978 0.928 0.933 0.965 0.816
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
13/50 4.97G 0.6676 0.4582 0.9629 350 640: 100%|██████████| 250/250 [00:27<00:00, 9.09it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.98it/s]
all 1004 16978 0.937 0.923 0.966 0.816
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
14/50 4.99G 0.6789 0.4561 0.9626 388 640: 100%|██████████| 250/250 [00:27<00:00, 9.10it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.95it/s]
all 1004 16978 0.932 0.931 0.967 0.819
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
15/50 5.52G 0.6801 0.4553 0.9661 265 640: 100%|██████████| 250/250 [00:27<00:00, 9.18it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.90it/s]
all 1004 16978 0.932 0.933 0.966 0.823
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
16/50 5.53G 0.6426 0.4294 0.9549 169 640: 100%|██████████| 250/250 [00:27<00:00, 9.16it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.98it/s]
all 1004 16978 0.945 0.929 0.968 0.83
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
17/50 5.55G 0.6363 0.4237 0.9509 290 640: 100%|██████████| 250/250 [00:27<00:00, 9.12it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.86it/s]
all 1004 16978 0.942 0.932 0.97 0.832
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
18/50 5.57G 0.6419 0.4193 0.9522 284 640: 100%|██████████| 250/250 [00:27<00:00, 9.10it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.02it/s]
all 1004 16978 0.944 0.943 0.972 0.833
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
19/50 5.59G 0.6378 0.4202 0.9489 362 640: 100%|██████████| 250/250 [00:27<00:00, 9.16it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.01it/s]
all 1004 16978 0.938 0.931 0.971 0.832
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
20/50 5.6G 0.6299 0.4129 0.9496 216 640: 100%|██████████| 250/250 [00:27<00:00, 9.07it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.74it/s]
all 1004 16978 0.938 0.93 0.969 0.833
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
21/50 5.62G 0.6056 0.4015 0.9369 389 640: 100%|██████████| 250/250 [00:27<00:00, 9.05it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.94it/s]
all 1004 16978 0.948 0.934 0.97 0.826
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
22/50 5.64G 0.6067 0.4009 0.941 420 640: 100%|██████████| 250/250 [00:27<00:00, 9.11it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.82it/s]
all 1004 16978 0.946 0.937 0.971 0.837
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
23/50 5.65G 0.6088 0.403 0.937 488 640: 100%|██████████| 250/250 [00:27<00:00, 9.09it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.06it/s]
all 1004 16978 0.942 0.943 0.971 0.834
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
24/50 5.67G 0.6108 0.4018 0.9448 172 640: 100%|██████████| 250/250 [00:27<00:00, 9.05it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.03it/s]
all 1004 16978 0.947 0.939 0.972 0.844
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
25/50 5.69G 0.6025 0.3965 0.9377 435 640: 100%|██████████| 250/250 [00:27<00:00, 9.09it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.97it/s]
all 1004 16978 0.952 0.943 0.976 0.847
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
26/50 5.71G 0.5846 0.3874 0.9317 221 640: 100%|██████████| 250/250 [00:27<00:00, 9.09it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.04it/s]
all 1004 16978 0.951 0.938 0.973 0.842
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
27/50 5.72G 0.5983 0.3909 0.9356 318 640: 100%|██████████| 250/250 [00:27<00:00, 9.15it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.82it/s]
all 1004 16978 0.943 0.939 0.973 0.847
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
28/50 5.74G 0.5696 0.3734 0.9281 279 640: 100%|██████████| 250/250 [00:27<00:00, 9.13it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.04it/s]
all 1004 16978 0.953 0.944 0.974 0.848
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
29/50 5.76G 0.5743 0.3761 0.9262 415 640: 100%|██████████| 250/250 [00:27<00:00, 9.13it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.91it/s]
all 1004 16978 0.954 0.94 0.974 0.853
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
30/50 5.77G 0.5715 0.3805 0.9265 733 640: 100%|██████████| 250/250 [00:27<00:00, 9.07it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.84it/s]
all 1004 16978 0.952 0.943 0.975 0.856
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
31/50 5.79G 0.5681 0.3776 0.9219 240 640: 100%|██████████| 250/250 [00:27<00:00, 9.08it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.93it/s]
all 1004 16978 0.949 0.936 0.976 0.857
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
32/50 5.81G 0.5683 0.3826 0.9265 257 640: 100%|██████████| 250/250 [00:27<00:00, 9.12it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.87it/s]
all 1004 16978 0.951 0.936 0.974 0.853
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
33/50 5.82G 0.5708 0.381 0.926 328 640: 100%|██████████| 250/250 [00:27<00:00, 9.14it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.01it/s]
all 1004 16978 0.952 0.943 0.976 0.854
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
34/50 5.84G 0.5685 0.3805 0.9224 329 640: 100%|██████████| 250/250 [00:27<00:00, 9.05it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.93it/s]
all 1004 16978 0.953 0.944 0.977 0.859
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
35/50 5.86G 0.568 0.3859 0.9199 488 640: 100%|██████████| 250/250 [00:27<00:00, 9.09it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.90it/s]
all 1004 16978 0.947 0.948 0.975 0.859
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
36/50 5.88G 0.5446 0.3636 0.9138 505 640: 100%|██████████| 250/250 [00:27<00:00, 9.00it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.14it/s]
all 1004 16978 0.945 0.946 0.976 0.865
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
37/50 5.89G 0.5456 0.3689 0.9128 397 640: 100%|██████████| 250/250 [00:27<00:00, 9.22it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.90it/s]
all 1004 16978 0.934 0.938 0.974 0.86
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
38/50 5.91G 0.5288 0.363 0.9074 350 640: 100%|██████████| 250/250 [00:27<00:00, 9.09it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.04it/s]
all 1004 16978 0.943 0.94 0.975 0.861
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
39/50 5.93G 0.5396 0.369 0.9124 517 640: 100%|██████████| 250/250 [00:27<00:00, 9.15it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.86it/s]
all 1004 16978 0.945 0.937 0.976 0.865
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
40/50 5.94G 0.5316 0.3606 0.9067 421 640: 100%|██████████| 250/250 [00:27<00:00, 9.19it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.97it/s]
all 1004 16978 0.949 0.942 0.976 0.867
Closing dataloader mosaic
albumentations: Blur(p=0.01, blur_limit=(3, 7)), MedianBlur(p=0.01, blur_limit=(3, 7)), ToGray(p=0.01, method='weighted_average', num_output_channels=3), CLAHE(p=0.01, clip_limit=(1.0, 4.0), tile_grid_size=(8, 8))
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
41/50 5.96G 0.5461 0.3542 0.914 206 640: 100%|██████████| 250/250 [00:26<00:00, 9.31it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.91it/s]
all 1004 16978 0.951 0.94 0.976 0.866
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
42/50 5.98G 0.536 0.3448 0.9106 264 640: 100%|██████████| 250/250 [00:25<00:00, 9.73it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.98it/s]
all 1004 16978 0.962 0.946 0.978 0.848
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
43/50 6G 0.5262 0.3413 0.909 119 640: 100%|██████████| 250/250 [00:25<00:00, 9.83it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.89it/s]
all 1004 16978 0.953 0.944 0.976 0.869
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
44/50 6.01G 0.5171 0.3352 0.8988 173 640: 100%|██████████| 250/250 [00:25<00:00, 9.85it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.01it/s]
all 1004 16978 0.952 0.94 0.977 0.872
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
45/50 6.03G 0.5058 0.3289 0.8987 163 640: 100%|██████████| 250/250 [00:25<00:00, 9.78it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.95it/s]
all 1004 16978 0.955 0.941 0.977 0.872
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
46/50 6.04G 0.5066 0.3245 0.8966 150 640: 100%|██████████| 250/250 [00:25<00:00, 9.83it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.18it/s]
all 1004 16978 0.958 0.943 0.978 0.875
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
47/50 6.06G 0.4978 0.3206 0.8937 174 640: 100%|██████████| 250/250 [00:25<00:00, 9.83it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.06it/s]
all 1004 16978 0.954 0.945 0.979 0.872
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
48/50 6.08G 0.4932 0.3218 0.8908 199 640: 100%|██████████| 250/250 [00:25<00:00, 9.87it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.91it/s]
all 1004 16978 0.951 0.948 0.977 0.875
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
49/50 6.1G 0.4774 0.313 0.8827 202 640: 100%|██████████| 250/250 [00:25<00:00, 9.86it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 7.04it/s]
all 1004 16978 0.955 0.943 0.978 0.877
Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size
50/50 6.11G 0.4749 0.3137 0.8826 182 640: 100%|██████████| 250/250 [00:25<00:00, 9.82it/s]
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:04<00:00, 6.96it/s]
all 1004 16978 0.96 0.946 0.979 0.879
50 epochs completed in 0.447 hours.
Optimizer stripped from runs/detect/train/weights/last.pt, 6.2MB
Optimizer stripped from runs/detect/train/weights/best.pt, 6.2MB
Validating runs/detect/train/weights/best.pt...
Ultralytics 8.3.159 🚀 Python-3.11.13 torch-2.6.0+cu124 CUDA:0 (NVIDIA L4, 22693MiB)
Model summary (fused): 72 layers, 3,006,038 parameters, 0 gradients, 8.1 GFLOPs
Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 32/32 [00:06<00:00, 5.28it/s]
all 1004 16978 0.96 0.946 0.979 0.879
empty 916 5957 0.945 0.931 0.973 0.827
occupied 884 11021 0.976 0.962 0.985 0.931
Speed: 0.1ms preprocess, 0.6ms inference, 0.0ms loss, 1.0ms postprocess per image
Results saved to runs/detect/train
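The final validation table reports precision and recall per class; these can be combined into a single F1 score per class. A minimal sketch using the values from the log above (the `f1_score` helper is ours, not part of Ultralytics):

```python
# F1 = 2PR / (P + R), computed from the precision/recall values
# reported in the final validation table above.

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

metrics = {
    "all": (0.960, 0.946),
    "empty": (0.945, 0.931),
    "occupied": (0.976, 0.962),
}

for name, (p, r) in metrics.items():
    print(f"{name}: F1 = {f1_score(p, r):.3f}")
```

This gives an F1 around 0.95 overall, with the `occupied` class (≈0.97) noticeably stronger than `empty` (≈0.94), consistent with the mAP50-95 gap between the two classes above.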
ultralytics.utils.metrics.DetMetrics object with attributes:
ap_class_index: array([0, 1])
box: ultralytics.utils.metrics.Metric object
confusion_matrix: <ultralytics.utils.metrics.ConfusionMatrix object at 0x7e7d9a79ac50>
curves: ['Precision-Recall(B)', 'F1-Confidence(B)', 'Precision-Confidence(B)', 'Recall-Confidence(B)']
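The `F1-Confidence(B)` curve listed above is derived from the precision- and recall-vs-confidence curves: at each confidence threshold, F1 is the harmonic mean of the two. A small sketch of the idea on synthetic arrays (the curve shapes here are illustrative, not taken from this run):

```python
import numpy as np

# Illustrative precision/recall curves over a confidence grid,
# mimicking the ~1000-point threshold grid stored in curves_results.
conf = np.linspace(0, 1, 1000)
precision = conf ** 0.3        # precision tends to rise with confidence
recall = 1 - conf ** 2         # recall tends to fall with confidence

# F1 at each threshold (small epsilon avoids division by zero).
f1 = 2 * precision * recall / (precision + recall + 1e-16)

# The peak of this curve is a common way to pick a deployment threshold.
best = conf[f1.argmax()]
print(f"best confidence threshold ≈ {best:.2f}, F1 ≈ {f1.max():.3f}")
```

Picking the confidence threshold at the F1 peak is how we would choose an operating point for the parking-lot camera, rather than keeping the default.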
curves_results: [raw curve arrays omitted for readability — each curve stores ~1000 evenly spaced thresholds in [0, 1] paired with the per-class values for Recall/Precision, Confidence/F1, and Confidence/Precision]
0.88889, 0.88989, 0.89089, 0.89189, 0.89289, 0.89389, 0.89489, 0.8959, 0.8969, 0.8979, 0.8989, 0.8999, 0.9009, 0.9019, 0.9029, 0.9039, 0.9049, 0.90591, 0.90691, 0.90791, 0.90891, 0.90991, 0.91091, 0.91191,
0.91291, 0.91391, 0.91491, 0.91592, 0.91692, 0.91792, 0.91892, 0.91992, 0.92092, 0.92192, 0.92292, 0.92392, 0.92492, 0.92593, 0.92693, 0.92793, 0.92893, 0.92993, 0.93093, 0.93193, 0.93293, 0.93393, 0.93493, 0.93594,
0.93694, 0.93794, 0.93894, 0.93994, 0.94094, 0.94194, 0.94294, 0.94394, 0.94494, 0.94595, 0.94695, 0.94795, 0.94895, 0.94995, 0.95095, 0.95195, 0.95295, 0.95395, 0.95495, 0.95596, 0.95696, 0.95796, 0.95896, 0.95996,
0.96096, 0.96196, 0.96296, 0.96396, 0.96496, 0.96597, 0.96697, 0.96797, 0.96897, 0.96997, 0.97097, 0.97197, 0.97297, 0.97397, 0.97497, 0.97598, 0.97698, 0.97798, 0.97898, 0.97998, 0.98098, 0.98198, 0.98298, 0.98398,
0.98498, 0.98599, 0.98699, 0.98799, 0.98899, 0.98999, 0.99099, 0.99199, 0.99299, 0.99399, 0.99499, 0.996, 0.997, 0.998, 0.999, 1]), array([[ 0.23544, 0.23544, 0.32102, ..., 1, 1, 1],
[ 0.3623, 0.3623, 0.48255, ..., 1, 1, 1]]), 'Confidence', 'Precision'], [array([ 0, 0.001001, 0.002002, 0.003003, 0.004004, 0.005005, 0.006006, 0.007007, 0.008008, 0.009009, 0.01001, 0.011011, 0.012012, 0.013013, 0.014014, 0.015015, 0.016016, 0.017017, 0.018018, 0.019019, 0.02002, 0.021021, 0.022022, 0.023023,
0.024024, 0.025025, 0.026026, 0.027027, 0.028028, 0.029029, 0.03003, 0.031031, 0.032032, 0.033033, 0.034034, 0.035035, 0.036036, 0.037037, 0.038038, 0.039039, 0.04004, 0.041041, 0.042042, 0.043043, 0.044044, 0.045045, 0.046046, 0.047047,
0.048048, 0.049049, 0.05005, 0.051051, 0.052052, 0.053053, 0.054054, 0.055055, 0.056056, 0.057057, 0.058058, 0.059059, 0.06006, 0.061061, 0.062062, 0.063063, 0.064064, 0.065065, 0.066066, 0.067067, 0.068068, 0.069069, 0.07007, 0.071071,
0.072072, 0.073073, 0.074074, 0.075075, 0.076076, 0.077077, 0.078078, 0.079079, 0.08008, 0.081081, 0.082082, 0.083083, 0.084084, 0.085085, 0.086086, 0.087087, 0.088088, 0.089089, 0.09009, 0.091091, 0.092092, 0.093093, 0.094094, 0.095095,
0.096096, 0.097097, 0.098098, 0.099099, 0.1001, 0.1011, 0.1021, 0.1031, 0.1041, 0.10511, 0.10611, 0.10711, 0.10811, 0.10911, 0.11011, 0.11111, 0.11211, 0.11311, 0.11411, 0.11512, 0.11612, 0.11712, 0.11812, 0.11912,
0.12012, 0.12112, 0.12212, 0.12312, 0.12412, 0.12513, 0.12613, 0.12713, 0.12813, 0.12913, 0.13013, 0.13113, 0.13213, 0.13313, 0.13413, 0.13514, 0.13614, 0.13714, 0.13814, 0.13914, 0.14014, 0.14114, 0.14214, 0.14314,
0.14414, 0.14515, 0.14615, 0.14715, 0.14815, 0.14915, 0.15015, 0.15115, 0.15215, 0.15315, 0.15415, 0.15516, 0.15616, 0.15716, 0.15816, 0.15916, 0.16016, 0.16116, 0.16216, 0.16316, 0.16416, 0.16517, 0.16617, 0.16717,
0.16817, 0.16917, 0.17017, 0.17117, 0.17217, 0.17317, 0.17417, 0.17518, 0.17618, 0.17718, 0.17818, 0.17918, 0.18018, 0.18118, 0.18218, 0.18318, 0.18418, 0.18519, 0.18619, 0.18719, 0.18819, 0.18919, 0.19019, 0.19119,
0.19219, 0.19319, 0.19419, 0.1952, 0.1962, 0.1972, 0.1982, 0.1992, 0.2002, 0.2012, 0.2022, 0.2032, 0.2042, 0.20521, 0.20621, 0.20721, 0.20821, 0.20921, 0.21021, 0.21121, 0.21221, 0.21321, 0.21421, 0.21522,
0.21622, 0.21722, 0.21822, 0.21922, 0.22022, 0.22122, 0.22222, 0.22322, 0.22422, 0.22523, 0.22623, 0.22723, 0.22823, 0.22923, 0.23023, 0.23123, 0.23223, 0.23323, 0.23423, 0.23524, 0.23624, 0.23724, 0.23824, 0.23924,
0.24024, 0.24124, 0.24224, 0.24324, 0.24424, 0.24525, 0.24625, 0.24725, 0.24825, 0.24925, 0.25025, 0.25125, 0.25225, 0.25325, 0.25425, 0.25526, 0.25626, 0.25726, 0.25826, 0.25926, 0.26026, 0.26126, 0.26226, 0.26326,
0.26426, 0.26527, 0.26627, 0.26727, 0.26827, 0.26927, 0.27027, 0.27127, 0.27227, 0.27327, 0.27427, 0.27528, 0.27628, 0.27728, 0.27828, 0.27928, 0.28028, 0.28128, 0.28228, 0.28328, 0.28428, 0.28529, 0.28629, 0.28729,
0.28829, 0.28929, 0.29029, 0.29129, 0.29229, 0.29329, 0.29429, 0.2953, 0.2963, 0.2973, 0.2983, 0.2993, 0.3003, 0.3013, 0.3023, 0.3033, 0.3043, 0.30531, 0.30631, 0.30731, 0.30831, 0.30931, 0.31031, 0.31131,
0.31231, 0.31331, 0.31431, 0.31532, 0.31632, 0.31732, 0.31832, 0.31932, 0.32032, 0.32132, 0.32232, 0.32332, 0.32432, 0.32533, 0.32633, 0.32733, 0.32833, 0.32933, 0.33033, 0.33133, 0.33233, 0.33333, 0.33433, 0.33534,
0.33634, 0.33734, 0.33834, 0.33934, 0.34034, 0.34134, 0.34234, 0.34334, 0.34434, 0.34535, 0.34635, 0.34735, 0.34835, 0.34935, 0.35035, 0.35135, 0.35235, 0.35335, 0.35435, 0.35536, 0.35636, 0.35736, 0.35836, 0.35936,
0.36036, 0.36136, 0.36236, 0.36336, 0.36436, 0.36537, 0.36637, 0.36737, 0.36837, 0.36937, 0.37037, 0.37137, 0.37237, 0.37337, 0.37437, 0.37538, 0.37638, 0.37738, 0.37838, 0.37938, 0.38038, 0.38138, 0.38238, 0.38338,
0.38438, 0.38539, 0.38639, 0.38739, 0.38839, 0.38939, 0.39039, 0.39139, 0.39239, 0.39339, 0.39439, 0.3954, 0.3964, 0.3974, 0.3984, 0.3994, 0.4004, 0.4014, 0.4024, 0.4034, 0.4044, 0.40541, 0.40641, 0.40741,
0.40841, 0.40941, 0.41041, 0.41141, 0.41241, 0.41341, 0.41441, 0.41542, 0.41642, 0.41742, 0.41842, 0.41942, 0.42042, 0.42142, 0.42242, 0.42342, 0.42442, 0.42543, 0.42643, 0.42743, 0.42843, 0.42943, 0.43043, 0.43143,
0.43243, 0.43343, 0.43443, 0.43544, 0.43644, 0.43744, 0.43844, 0.43944, 0.44044, 0.44144, 0.44244, 0.44344, 0.44444, 0.44545, 0.44645, 0.44745, 0.44845, 0.44945, 0.45045, 0.45145, 0.45245, 0.45345, 0.45445, 0.45546,
0.45646, 0.45746, 0.45846, 0.45946, 0.46046, 0.46146, 0.46246, 0.46346, 0.46446, 0.46547, 0.46647, 0.46747, 0.46847, 0.46947, 0.47047, 0.47147, 0.47247, 0.47347, 0.47447, 0.47548, 0.47648, 0.47748, 0.47848, 0.47948,
0.48048, 0.48148, 0.48248, 0.48348, 0.48448, 0.48549, 0.48649, 0.48749, 0.48849, 0.48949, 0.49049, 0.49149, 0.49249, 0.49349, 0.49449, 0.4955, 0.4965, 0.4975, 0.4985, 0.4995, 0.5005, 0.5015, 0.5025, 0.5035,
0.5045, 0.50551, 0.50651, 0.50751, 0.50851, 0.50951, 0.51051, 0.51151, 0.51251, 0.51351, 0.51451, 0.51552, 0.51652, 0.51752, 0.51852, 0.51952, 0.52052, 0.52152, 0.52252, 0.52352, 0.52452, 0.52553, 0.52653, 0.52753,
0.52853, 0.52953, 0.53053, 0.53153, 0.53253, 0.53353, 0.53453, 0.53554, 0.53654, 0.53754, 0.53854, 0.53954, 0.54054, 0.54154, 0.54254, 0.54354, 0.54454, 0.54555, 0.54655, 0.54755, 0.54855, 0.54955, 0.55055, 0.55155,
0.55255, 0.55355, 0.55455, 0.55556, 0.55656, 0.55756, 0.55856, 0.55956, 0.56056, 0.56156, 0.56256, 0.56356, 0.56456, 0.56557, 0.56657, 0.56757, 0.56857, 0.56957, 0.57057, 0.57157, 0.57257, 0.57357, 0.57457, 0.57558,
0.57658, 0.57758, 0.57858, 0.57958, 0.58058, 0.58158, 0.58258, 0.58358, 0.58458, 0.58559, 0.58659, 0.58759, 0.58859, 0.58959, 0.59059, 0.59159, 0.59259, 0.59359, 0.59459, 0.5956, 0.5966, 0.5976, 0.5986, 0.5996,
0.6006, 0.6016, 0.6026, 0.6036, 0.6046, 0.60561, 0.60661, 0.60761, 0.60861, 0.60961, 0.61061, 0.61161, 0.61261, 0.61361, 0.61461, 0.61562, 0.61662, 0.61762, 0.61862, 0.61962, 0.62062, 0.62162, 0.62262, 0.62362,
0.62462, 0.62563, 0.62663, 0.62763, 0.62863, 0.62963, 0.63063, 0.63163, 0.63263, 0.63363, 0.63463, 0.63564, 0.63664, 0.63764, 0.63864, 0.63964, 0.64064, 0.64164, 0.64264, 0.64364, 0.64464, 0.64565, 0.64665, 0.64765,
0.64865, 0.64965, 0.65065, 0.65165, 0.65265, 0.65365, 0.65465, 0.65566, 0.65666, 0.65766, 0.65866, 0.65966, 0.66066, 0.66166, 0.66266, 0.66366, 0.66466, 0.66567, 0.66667, 0.66767, 0.66867, 0.66967, 0.67067, 0.67167,
0.67267, 0.67367, 0.67467, 0.67568, 0.67668, 0.67768, 0.67868, 0.67968, 0.68068, 0.68168, 0.68268, 0.68368, 0.68468, 0.68569, 0.68669, 0.68769, 0.68869, 0.68969, 0.69069, 0.69169, 0.69269, 0.69369, 0.69469, 0.6957,
0.6967, 0.6977, 0.6987, 0.6997, 0.7007, 0.7017, 0.7027, 0.7037, 0.7047, 0.70571, 0.70671, 0.70771, 0.70871, 0.70971, 0.71071, 0.71171, 0.71271, 0.71371, 0.71471, 0.71572, 0.71672, 0.71772, 0.71872, 0.71972,
0.72072, 0.72172, 0.72272, 0.72372, 0.72472, 0.72573, 0.72673, 0.72773, 0.72873, 0.72973, 0.73073, 0.73173, 0.73273, 0.73373, 0.73473, 0.73574, 0.73674, 0.73774, 0.73874, 0.73974, 0.74074, 0.74174, 0.74274, 0.74374,
0.74474, 0.74575, 0.74675, 0.74775, 0.74875, 0.74975, 0.75075, 0.75175, 0.75275, 0.75375, 0.75475, 0.75576, 0.75676, 0.75776, 0.75876, 0.75976, 0.76076, 0.76176, 0.76276, 0.76376, 0.76476, 0.76577, 0.76677, 0.76777,
0.76877, 0.76977, 0.77077, 0.77177, 0.77277, 0.77377, 0.77477, 0.77578, 0.77678, 0.77778, 0.77878, 0.77978, 0.78078, 0.78178, 0.78278, 0.78378, 0.78478, 0.78579, 0.78679, 0.78779, 0.78879, 0.78979, 0.79079, 0.79179,
0.79279, 0.79379, 0.79479, 0.7958, 0.7968, 0.7978, 0.7988, 0.7998, 0.8008, 0.8018, 0.8028, 0.8038, 0.8048, 0.80581, 0.80681, 0.80781, 0.80881, 0.80981, 0.81081, 0.81181, 0.81281, 0.81381, 0.81481, 0.81582,
0.81682, 0.81782, 0.81882, 0.81982, 0.82082, 0.82182, 0.82282, 0.82382, 0.82482, 0.82583, 0.82683, 0.82783, 0.82883, 0.82983, 0.83083, 0.83183, 0.83283, 0.83383, 0.83483, 0.83584, 0.83684, 0.83784, 0.83884, 0.83984,
0.84084, 0.84184, 0.84284, 0.84384, 0.84484, 0.84585, 0.84685, 0.84785, 0.84885, 0.84985, 0.85085, 0.85185, 0.85285, 0.85385, 0.85485, 0.85586, 0.85686, 0.85786, 0.85886, 0.85986, 0.86086, 0.86186, 0.86286, 0.86386,
0.86486, 0.86587, 0.86687, 0.86787, 0.86887, 0.86987, 0.87087, 0.87187, 0.87287, 0.87387, 0.87487, 0.87588, 0.87688, 0.87788, 0.87888, 0.87988, 0.88088, 0.88188, 0.88288, 0.88388, 0.88488, 0.88589, 0.88689, 0.88789,
0.88889, 0.88989, 0.89089, 0.89189, 0.89289, 0.89389, 0.89489, 0.8959, 0.8969, 0.8979, 0.8989, 0.8999, 0.9009, 0.9019, 0.9029, 0.9039, 0.9049, 0.90591, 0.90691, 0.90791, 0.90891, 0.90991, 0.91091, 0.91191,
0.91291, 0.91391, 0.91491, 0.91592, 0.91692, 0.91792, 0.91892, 0.91992, 0.92092, 0.92192, 0.92292, 0.92392, 0.92492, 0.92593, 0.92693, 0.92793, 0.92893, 0.92993, 0.93093, 0.93193, 0.93293, 0.93393, 0.93493, 0.93594,
0.93694, 0.93794, 0.93894, 0.93994, 0.94094, 0.94194, 0.94294, 0.94394, 0.94494, 0.94595, 0.94695, 0.94795, 0.94895, 0.94995, 0.95095, 0.95195, 0.95295, 0.95395, 0.95495, 0.95596, 0.95696, 0.95796, 0.95896, 0.95996,
0.96096, 0.96196, 0.96296, 0.96396, 0.96496, 0.96597, 0.96697, 0.96797, 0.96897, 0.96997, 0.97097, 0.97197, 0.97297, 0.97397, 0.97497, 0.97598, 0.97698, 0.97798, 0.97898, 0.97998, 0.98098, 0.98198, 0.98298, 0.98398,
0.98498, 0.98599, 0.98699, 0.98799, 0.98899, 0.98999, 0.99099, 0.99199, 0.99299, 0.99399, 0.99499, 0.996, 0.997, 0.998, 0.999, 1]), array([[ 0.98758, 0.98758, 0.9859, ..., 0, 0, 0],
[ 0.98957, 0.98957, 0.98884, ..., 0, 0, 0]]), 'Confidence', 'Recall']]
fitness: np.float64(0.8887266071637551)
keys: ['metrics/precision(B)', 'metrics/recall(B)', 'metrics/mAP50(B)', 'metrics/mAP50-95(B)']
maps: array([ 0.82659, 0.93084])
names: {0: 'empty', 1: 'occupied'}
nt_per_class: array([ 5957, 11021])
nt_per_image: array([916, 884])
results_dict: {'metrics/precision(B)': np.float64(0.9604460129648562), 'metrics/recall(B)': np.float64(0.9461865321794785), 'metrics/mAP50(B)': np.float64(0.978841666784843), 'metrics/mAP50-95(B)': np.float64(0.878713822761412), 'fitness': np.float64(0.8887266071637551)}
save_dir: PosixPath('runs/detect/train')
speed: {'preprocess': 0.11560371513947772, 'inference': 0.5652818675300469, 'loss': 0.00028586454159285426, 'postprocess': 0.9873207529876545}
stats: {'tp': [], 'conf': [], 'pred_cls': [], 'target_cls': [], 'target_img': []}
task: 'detect'
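The `fitness` value reported above is not an independent metric: Ultralytics computes detection fitness as a weighted sum of the two mAP values, `0.1 * mAP50 + 0.9 * mAP50-95`. We can verify this against the numbers in `results_dict`:

```python
# Values taken from results_dict above
map50 = 0.978841666784843      # metrics/mAP50(B)
map50_95 = 0.878713822761412   # metrics/mAP50-95(B)

# Ultralytics' fitness weighting for detection tasks
fitness = 0.1 * map50 + 0.9 * map50_95
print(round(fitness, 6))  # 0.888727 — matches the reported fitness
```

This explains why fitness sits close to mAP50-95: the stricter metric dominates the weighting.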
import torchvision
import torchvision.transforms as transforms
import numpy as np
import tensorflow as tf
def build_fc_model():
    fc_model = tf.keras.Sequential([
        # Flatten the 32x32 RGB input into a single 3072-feature vector
        tf.keras.layers.Flatten(input_shape=(32, 32, 3)),
        # First fully connected (Dense) layer
        tf.keras.layers.Dense(512, activation=tf.nn.relu),
        # Second fully connected (Dense) layer
        tf.keras.layers.Dense(128, activation=tf.nn.relu),
        # Final Dense layer outputs the classification probabilities
        tf.keras.layers.Dense(10, activation='softmax')
    ])
    return fc_model
model = build_fc_model()
2025-07-02 17:37:42.837486: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-07-02 17:37:42.856990: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:477] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1751470662.885640 45532 cuda_dnn.cc:8310] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1751470662.894235 45532 cuda_blas.cc:1418] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2025-07-02 17:37:42.925129: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 AVX_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/maelwenn/EFREI/S8/DAI/Neural_Networks_and_Deep_Learning/Project_neural_net/.venv/lib/python3.12/site-packages/keras/src/layers/reshaping/flatten.py:37: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead. super().__init__(**kwargs)
2025-07-02 17:37:45.193150: E external/local_xla/xla/stream_executor/cuda/cuda_driver.cc:152] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: CUDA_ERROR_NO_DEVICE: no CUDA-capable device is detected
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=1e-1),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ flatten (Flatten)               │ (None, 3072)           │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 512)            │     1,573,376 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 128)            │        65,664 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_2 (Dense)                 │ (None, 10)             │         1,290 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 1,640,330 (6.26 MB)
Trainable params: 1,640,330 (6.26 MB)
Non-trainable params: 0 (0.00 B)
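The parameter counts in the summary can be checked by hand: a Dense layer holds `(inputs + 1 bias) * units` trainable parameters, and the Flatten layer contributes none.

```python
# Reproducing the parameter counts from the model summary
flatten_out = 32 * 32 * 3          # Flatten turns a 32x32x3 image into 3072 features
dense1 = (flatten_out + 1) * 512   # 1,573,376 params (weights + biases)
dense2 = (512 + 1) * 128           # 65,664 params
dense3 = (128 + 1) * 10            # 1,290 params
total = dense1 + dense2 + dense3
print(total)  # 1640330, matching "Total params: 1,640,330"
```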
BATCH_SIZE = 64
EPOCHS = 10
model.fit(train_images, train_labels, batch_size=BATCH_SIZE, epochs=EPOCHS)
With three Roboflow datasets combined with our own personalized dataset, and using YOLOv8, our model performed much better:

With the help of three Roboflow datasets plus our own personalized dataset, we were able to fine-tune a YOLOv8 model.
This model is now highly accurate on the parking lot where we plan to test it in real-world conditions.
This project gave us a lot of new knowledge of both TensorFlow and YOLO deep CNN models.
We were able to create our own dataset, and we also learned to use public datasets found on Roboflow.